Kernel ridge vs. principal component regression: Minimax bounds and the qualification of regularization operators


Similar articles

Kernel ridge vs. principal component regression: Minimax bounds and the qualification of regularization operators

Regularization is an essential element of virtually all kernel methods for nonparametric regression problems. A critical factor in the effectiveness of a given kernel method is the type of regularization that is employed. This article compares and contrasts members from a general class of regularization techniques, which notably includes ridge regression and principal component regression. We d...
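As a rough illustration of the "general class of regularization techniques" the abstract refers to, the sketch below fits kernel ridge regression and kernel principal component regression on the same Gram matrix, viewing each as a spectral filter on the kernel eigenvalues. This is not the paper's code: the RBF kernel, the helper names (rbf_kernel, spectral_filter_fit, krr_filter, kpcr_filter), and all parameter values are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    """Gaussian (RBF) kernel matrix between rows of X and Z."""
    sq = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def spectral_filter_fit(K, y, filter_fn):
    """Dual coefficients alpha = V diag(g(eigvals)) V^T y, where the
    spectral filter g defines the regularization scheme."""
    eigvals, V = np.linalg.eigh(K)      # K = V diag(eigvals) V^T
    g = filter_fn(eigvals)              # filter applied to the spectrum
    return V @ (g * (V.T @ y))

# Kernel ridge regression: g(s) = 1 / (s + n*lam), a smooth shrinkage
# of every eigendirection.
def krr_filter(lam, n):
    return lambda s: 1.0 / (s + n * lam)

# Kernel principal component regression: keep the top-r eigendirections,
# invert them exactly, and discard the rest (hard spectral cut-off).
def kpcr_filter(r):
    def g(s):
        out = np.zeros_like(s)
        top = np.argsort(s)[-r:]        # indices of the r largest eigenvalues
        out[top] = 1.0 / s[top]
        return out
    return g

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-1, 1, size=(200, 1))
    y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(200)

    K = rbf_kernel(X, X, gamma=5.0)
    alpha_krr = spectral_filter_fit(K, y, krr_filter(lam=1e-3, n=len(y)))
    alpha_pcr = spectral_filter_fit(K, y, kpcr_filter(r=20))

    # In-sample predictions for either regularizer: f_hat = K @ alpha.
    print("KRR train MSE :", np.mean((K @ alpha_krr - y) ** 2))
    print("KPCR train MSE:", np.mean((K @ alpha_pcr - y) ** 2))
```

Both estimators are linear in y and differ only in the filter applied to the Gram spectrum: ridge shrinks every eigendirection smoothly, while principal component regression keeps the leading components exactly and zeroes out the rest.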


Kernel ridge vs. principal component regression: minimax bounds and adaptability of regularization operators

Regularization is an essential element of virtually all kernel methods for nonparametric regression problems. A critical factor in the effectiveness of a given kernel method is the type of regularization that is employed. This article compares and contrasts members from a general class of regularization techniques, which notably includes ridge regression and principal component regression...


Kernel Ridge Regression via Partitioning

In this paper, we investigate a divide and conquer approach to Kernel Ridge Regression (KRR). Given n samples, the division step involves separating the points based on some underlying disjoint partition of the input space (possibly via clustering), and then computing a KRR estimate for each partition. The conquering step is simple: for each partition, we only consider its own local estimate fo...
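A minimal sketch of the divide-and-conquer recipe described above, under the assumption that the disjoint partition comes from k-means clustering and that each cell's estimator is scikit-learn's KernelRidge; the function names and all hyperparameter values are illustrative, not the authors' implementation.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.kernel_ridge import KernelRidge

def fit_partitioned_krr(X, y, n_parts=4, alpha=1e-2, gamma=5.0, seed=0):
    """Divide: split the input space with k-means; fit one KRR model per cell."""
    km = KMeans(n_clusters=n_parts, random_state=seed, n_init=10).fit(X)
    models = []
    for c in range(n_parts):
        mask = km.labels_ == c
        model = KernelRidge(alpha=alpha, kernel="rbf", gamma=gamma)
        model.fit(X[mask], y[mask])
        models.append(model)
    return km, models

def predict_partitioned_krr(km, models, X_new):
    """Conquer: each test point is answered only by the model of its own cell."""
    cells = km.predict(X_new)
    y_hat = np.empty(len(X_new))
    for c, model in enumerate(models):
        mask = cells == c
        if mask.any():
            y_hat[mask] = model.predict(X_new[mask])
    return y_hat

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.uniform(-1, 1, size=(2000, 2))
    y = np.sin(3 * X[:, 0]) * np.cos(2 * X[:, 1]) + 0.1 * rng.standard_normal(2000)

    km, models = fit_partitioned_krr(X, y)
    X_test = rng.uniform(-1, 1, size=(500, 2))
    y_true = np.sin(3 * X_test[:, 0]) * np.cos(2 * X_test[:, 1])
    y_pred = predict_partitioned_krr(km, models, X_test)
    print("test MSE:", np.mean((y_pred - y_true) ** 2))
```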


Kernel methods and regularization techniques for nonparametric regression: Minimax optimality and adaptation

Regularization is an essential element of virtually all kernel methods for nonparametric regression problems. A critical factor in the effectiveness of a given kernel method is the type of regularization that is employed. This article compares and contrasts members from a general class of regularization techniques, which notably includes ridge regression and principal component regression. We f...


Divide and conquer kernel ridge regression: a distributed algorithm with minimax optimal rates

We study a decomposition-based scalable approach to kernel ridge regression, and show that it achieves minimax optimal convergence rates under relatively mild conditions. The method is simple to describe: it randomly partitions a dataset of size N into m subsets of equal size, computes an independent kernel ridge regression estimator for each subset using a careful choice of the regularization ...
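The sketch below follows the description above under stated assumptions: an RBF kernel, scikit-learn's KernelRidge as the local estimator, and a plain average of the m local predictions as the combined estimate; the regularization level alpha here is illustrative rather than the careful choice the abstract alludes to.

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge

def distributed_krr(X, y, m=8, alpha=1e-3, gamma=5.0, seed=0):
    """Randomly split the N samples into m equal-size subsets, fit an
    independent KRR estimator on each, and average their predictions."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))
    estimators = []
    for chunk in np.array_split(idx, m):
        est = KernelRidge(alpha=alpha, kernel="rbf", gamma=gamma)
        est.fit(X[chunk], y[chunk])
        estimators.append(est)
    # The global estimate is the plain average of the m local estimates.
    return lambda X_new: np.mean([e.predict(X_new) for e in estimators], axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    X = rng.uniform(-1, 1, size=(4000, 1))
    y = np.sin(4 * X[:, 0]) + 0.1 * rng.standard_normal(4000)

    predict = distributed_krr(X, y, m=8)
    X_test = np.linspace(-1, 1, 200)[:, None]
    print("test MSE:", np.mean((predict(X_test) - np.sin(4 * X_test[:, 0])) ** 2))
```

Each subset problem involves only an (N/m) x (N/m) kernel matrix, which is what makes the approach scalable relative to a single KRR solve on all N points.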



Journal

Journal title: Electronic Journal of Statistics

Year: 2017

ISSN: 1935-7524

DOI: 10.1214/17-ejs1258